Feat: add gpt 5 models and verbosity param #4086
Conversation
@sobychacko @markpollack updating the latest changes from 1a3ed95. GPT-5 models: https://platform.openai.com/docs/models
* <p>
* Note: GPT-5 models require temperature=1.0 (default value). Custom temperature
* values are not supported and will cause errors.
Is this not the case anymore? If so, we need to update the docs changed in this commit: https://github.com/spring-projects/spring-ai/pull/4068/files
Thank you for pointing this one out, @sobychacko. It is the case for all GPT-5 models, but not for the gpt-5-chat model: https://community.openai.com/t/temperature-in-gpt-5-models/1337133/25
I have updated the documentation and added tests as well.
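For reference, a minimal sketch of what this means on the caller side, assuming the Spring AI OpenAiChatOptions builder with its model(...) and temperature(...) methods; the exact model ids (gpt-5, gpt-5-chat-latest) are taken from the OpenAI model listing and may differ from what this PR registers.

```java
import org.springframework.ai.openai.OpenAiChatOptions;

class Gpt5TemperatureExample {

    // GPT-5 reasoning models: leave temperature at its default (1.0);
    // the API rejects any custom value.
    static OpenAiChatOptions gpt5Options() {
        return OpenAiChatOptions.builder()
            .model("gpt-5")
            .build();
    }

    // gpt-5-chat is the exception discussed in this thread: it still
    // accepts a custom temperature like the earlier chat models.
    static OpenAiChatOptions gpt5ChatOptions() {
        return OpenAiChatOptions.builder()
            .model("gpt-5-chat-latest")
            .temperature(0.7)
            .build();
    }
}
```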
Does it mean that the GPT-5 models default to a temperature value of 1.0? If so, I think we should make that clear in the docs or Javadocs. From what I hear from you, the GPT-5 models completely ignore temperature and internally default to a value of 1.0. Is that correct?
Yes, that is correct. Azure also mentions more parameters that are not supported, but I am not sure whether that info belongs in the Spring AI documentation, since it is a limitation of the LLM itself rather than of the framework.
The following are currently unsupported with reasoning models:
temperature, top_p, presence_penalty, frequency_penalty, logprobs, top_logprobs, logit_bias, max_tokens
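To make that list concrete, here is an illustrative, caller-side sketch, not the framework's actual behavior: it simply skips the sampling parameters that reasoning models reject. The startsWith check on the model id is a simplification for this sketch, and the builder methods used (model, temperature, topP) are the ones I believe exist on OpenAiChatOptions.

```java
import org.springframework.ai.openai.OpenAiChatOptions;

final class ReasoningModelOptions {

    // Illustrative guard only: reasoning models reject temperature, top_p,
    // presence_penalty, frequency_penalty, logprobs, top_logprobs,
    // logit_bias and max_tokens, so sampling settings are applied only to
    // non-reasoning models here.
    static OpenAiChatOptions build(String model, Double temperature, Double topP) {
        var builder = OpenAiChatOptions.builder().model(model);
        boolean reasoning = model.startsWith("gpt-5") && !model.startsWith("gpt-5-chat");
        if (!reasoning) {
            if (temperature != null) {
                builder.temperature(temperature);
            }
            if (topP != null) {
                builder.topP(topP);
            }
        }
        return builder.build();
    }
}
```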
- update gpt-5 tests
- add verbosity parameter
- update documentation and add tests

Signed-off-by: Alexandros Pappas <[email protected]>
37a5830 to 826175b
- update gpt-5 tests
- add verbosity parameter
- update documentation and add tests

Fixes #4086

Signed-off-by: Alexandros Pappas <[email protected]>
(cherry picked from commit fef454b)
Add gpt-5 models and update their tests
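Finally, a hedged sketch of how the new verbosity option from this PR might be used. The verbosity(...) builder method name and the "low" value are assumptions based on the PR description and the OpenAI GPT-5 API, not confirmed Spring AI signatures.

```java
import org.springframework.ai.openai.OpenAiChatOptions;

class Gpt5VerbosityExample {

    // Hypothetical usage: verbosity(...) and the "low" value are assumed
    // from this PR's description; the actual method name may differ.
    static OpenAiChatOptions conciseGpt5Options() {
        return OpenAiChatOptions.builder()
            .model("gpt-5")
            .verbosity("low")
            .build();
    }
}
```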